Added Vercel AI Gateway as a provider#11

Merged
alexzhang13 merged 4 commits into alexzhang13:main from jerilynzheng:add-provider-vercel-ai-gateway
Jan 7, 2026

Conversation

@jerilynzheng
Contributor

Summary

Added Vercel AI Gateway as a provider using OpenAI-compatible API

Changes

  • Added "vercel" to ClientBackend type in rlm/core/types.py
  • Added AI_GATEWAY_API_KEY environment variable support in rlm/clients/openai.py
  • Added "vercel" backend routing in rlm/clients/__init__.py
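As a rough sketch, the three changes above could look like the following. This is an illustration only: `ClientBackend` and the file paths come from the PR description, but the helper names, the exact routing logic, and the gateway base URL are assumptions, not the actual rlm internals.

```python
import os
from typing import Literal, Optional

# 1. rlm/core/types.py: extend the backend type with "vercel".
ClientBackend = Literal["openai", "vercel"]

# 2. rlm/clients/openai.py: fall back to AI_GATEWAY_API_KEY for Vercel.
def resolve_api_key(backend: ClientBackend) -> Optional[str]:
    if backend == "vercel":
        return os.environ.get("AI_GATEWAY_API_KEY")
    return os.environ.get("OPENAI_API_KEY")

# 3. rlm/clients/__init__.py: route "vercel" to the OpenAI-compatible
#    client, pointed at the gateway endpoint (assumed URL).
def client_config(backend: ClientBackend) -> dict:
    config = {"api_key": resolve_api_key(backend)}
    if backend == "vercel":
        config["base_url"] = "https://ai-gateway.vercel.sh/v1"  # assumed endpoint
    return config
```

Because the gateway speaks the OpenAI-compatible API, the routing change reduces to reusing the existing OpenAI client with a different base URL and key.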

@alexzhang13
Owner

@jerilynzheng Thanks! Can you check one thing for me: someone noticed earlier that with Prime Intellect's inference on the OpenAI API, there was a mismatch in cost tracking. Since I don't have a Vercel API key to test this myself, could you verify that cost tracking works with Vercel? For example, when you run the RLM example, print the RLMChatCompletion object and post a screenshot here, so we can confirm the costs it reports are correct.
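One way to sanity-check the reported cost is to recompute it from the token counts in the completion's usage and compare against what the client reports. This is a hypothetical helper, and the per-million-token prices below are placeholders, not Vercel's or any provider's actual rates:

```python
# Hypothetical cost check: recompute the expected cost from token usage
# and per-million-token prices, then compare with the client's reported cost.
def expected_cost(prompt_tokens: int, completion_tokens: int,
                  input_price_per_m: float, output_price_per_m: float) -> float:
    return (prompt_tokens / 1_000_000 * input_price_per_m
            + completion_tokens / 1_000_000 * output_price_per_m)

# Example with placeholder prices ($3/M input, $15/M output):
cost = expected_cost(prompt_tokens=1_000, completion_tokens=500,
                     input_price_per_m=3.0, output_price_per_m=15.0)
# 1_000/1e6 * 3.0 + 500/1e6 * 15.0 = 0.003 + 0.0075 = 0.0105
assert abs(cost - 0.0105) < 1e-12
```

If the completion object's reported cost disagrees with this recomputation (using the provider's published prices), that would reproduce the mismatch seen with Prime Intellect.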

@tosinamuda

Hey @alexzhang13, are you thinking of using DSPy for managing LLM inference, given it already has integration with the different providers through LiteLLM?

@alexzhang13
Owner

> Hey @alexzhang13, are you thinking of using DSPy for managing LLM inference, given it already has integration with the different providers through LiteLLM?

Haven't thought of this yet; could be convinced to add it, though.

alexzhang13 merged commit bd92fc6 into alexzhang13:main on Jan 7, 2026
@jerilynzheng
Contributor Author

Hi @alexzhang13, sorry for the delay! Here's the screenshot:
[Screenshot: 2026-01-22 at 9:13:36 PM]
